
It can be almost creepy how much tracking algorithms can know about us just through our shopping habits. In an interesting case related in a Forbes article, Target started sending coupons for pregnancy-related items to women whose shopping habits indicated they were likely pregnant, sometimes before they had told their families (leading to at least one embarrassing revelation). This illustrates a couple of basic principles. The first is that we are all more predictable than we would like to believe. The second is that, in a world of big data and artificial intelligence, privacy is likely a thing of the past.

But these are just tools, and the same powerful tools can be used for good, selfish, or nefarious purposes. Tracking lots of data about us in order to predict health outcomes and personalize medical interventions would be a good use of this technology (and, predictably, it is lagging behind the purely marketing applications). Leveraging such data for health care is also helped by increasingly portable devices for tracking physiological parameters and behavior. Most of us now carry a powerful computer with us wherever we go (our smartphones), equipped with lots of sensors and the ability to add more for specific applications.

AI and mental health

A collaboration between MIT’s Rosalind Picard and Massachusetts General Hospital’s Paola Pedrelli is attempting to leverage AI algorithms and smartphone technology as an aid to help monitor and treat mental illness. This could be a feasible application, since behavior can be a major sign of mental illness and behavior can be tracked. There are already apps that monitor our sleep, and fitness devices that track our heart rate. Skin conductance can also be easily monitored. The GPS in the phone can track how often you leave your house, and the accelerometer can track how much you move and exercise.
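To make the sensor side concrete, here is a minimal sketch (my own illustration with made-up coordinates, readings, and thresholds, not anything from the project) of how raw GPS and accelerometer samples could be boiled down into simple behavioral features, such as the fraction of the day spent away from home and minutes of physical activity:

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical home coordinates and one day of (lat, lon) GPS samples.
HOME = (41.31, -72.92)
gps_samples = [(41.31, -72.92), (41.32, -72.91), (41.35, -72.88), (41.31, -72.92)]

# Hypothetical accelerometer magnitudes (in g), one reading per minute.
accel_magnitudes = [1.0, 1.1, 1.6, 2.0, 1.2, 1.0]

def distance_km(a, b):
    """Great-circle (haversine) distance between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

# Fraction of GPS samples taken away from home (beyond roughly 0.5 km).
away_fraction = sum(distance_km(p, HOME) > 0.5 for p in gps_samples) / len(gps_samples)

# Crude "active minutes": accelerometer magnitude well above resting (~1 g).
active_minutes = sum(m > 1.3 for m in accel_magnitudes)

print(f"fraction of samples away from home: {away_fraction:.2f}")
print(f"active minutes: {active_minutes}")
```

Real apps use far richer signals and smarter processing, but the basic idea is the same: raw sensor streams get summarized into a handful of behavioral features that can then be tracked over time.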

Picard and Pedrelli are feeding this kind of data to a machine-learning algorithm so that it can learn which behaviors and physiological parameters correlate with depression. They have enrolled 48 participants with depression so far:

For 22 hours per day, every day for 12 weeks, participants wear Empatica E4 wristbands. These wearable wristbands, designed by one of the companies Picard founded, can pick up information on biometric data, like electrodermal (skin) activity. Participants also download apps on their phone which collect data on texts and phone calls, location, and app usage, and also prompt them to complete a biweekly depression survey.
Every week, patients check in with a clinician who evaluates their depressive symptoms.

Right now they are just gathering data, teaching the AI to detect when a change is occurring in a subject that would indicate their depression is worsening or they are having an acute depressive episode. This is not just group data, but individual data, with the goal of determining for each individual subject which behavioral changes correlate with worsening depression.
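To give a sense of what individual-level modeling could look like, here is a hedged sketch (again my own illustration, not the researchers' actual pipeline; the weekly feature values and the flagging threshold are invented) of learning a participant's own baseline from past weeks and flagging a week that deviates sharply from it:

```python
from statistics import mean, stdev

# Hypothetical weekly summaries for one participant:
# (hours of sleep per night, active minutes per day, fraction of day away from home)
baseline_weeks = [
    (7.2, 45, 0.40),
    (6.9, 50, 0.35),
    (7.4, 42, 0.45),
    (7.0, 48, 0.38),
]
new_week = (5.8, 20, 0.10)  # less sleep, less movement, less time outside

def z_scores(week, history):
    """Deviation of each feature from this individual's own baseline."""
    scores = []
    for i, value in enumerate(week):
        column = [w[i] for w in history]
        mu, sigma = mean(column), stdev(column)
        scores.append((value - mu) / sigma if sigma else 0.0)
    return scores

deviations = z_scores(new_week, baseline_weeks)

# Flag the week if behavior has shifted sharply from the person's own norm.
if any(abs(z) > 2.0 for z in deviations):
    print("flag: behavior deviates from this individual's baseline", deviations)
```

The point of the per-individual approach is that the "normal" being compared against is each person's own history, not a population average, since a pattern that is routine for one person may signal trouble for another.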

They also speculate that this kind of data may be useful for recommending behavioral interventions. For example, if a subject is sleeping less and feeling more depressed, they could be told to try to sleep more. Of course, this would only work if the lack of sleep is causing the depression, rather than the other way around, or a third variable causing both the poor sleep and the worsening depression. Collecting lots of data and feeding it to a learning algorithm might be the easy part. It will take careful clinical trials to determine cause and effect (not just correlation) and to identify specific interventions.

This study is just giving us a glimpse of the potential here, and it will take a lot of research to know how best to apply it. It’s easy to imagine, however, mental fitness apps (just like physical fitness apps) that monitor relevant parameters and give feedback to the user. This could be helpful or harmful. Such apps could exacerbate anxiety about health or feed obsessive behavior. They might also give people a false sense of security, delaying them from seeking professional mental health care.

When used responsibly and in an evidence-based manner, however, this technology could also be used to alert one’s mental health-care provider to impending worsening of depression, or even suicide risk. It could also provide some objective feedback on the effects of specific interventions, such as cognitive behavioral therapy or medications.

However, I have a major concern about this technology, specifically how it will be abused. As we have seen in the pages of SBM over the years, every new medical technology comes with a host of quack knock-offs. In some cases the fraudulent use predates the scientific use by years, as with the stem-cell clinics that rode the wave of hype about stem-cell therapies, making promises years or decades ahead of the science.

In this case it is easy to predict the proliferation of health and mental health apps that make all sorts of promises about outcomes not based in evidence, but that are fueled by some off-the-shelf AI that finds correlations to give the products an air of legitimacy. Confusing correlation with causation will be a feature, not a bug. Slightly more sophisticated versions may even be used by health care professionals looking for more options and willing to get ahead of the evidence. Abuses and misuses like this are almost certain to happen, and we simply do not have the regulatory infrastructure to stop them.

This is a general problem with our medical infrastructure. In the early phases of a new technology it is easy to exploit preliminary evidence, leap over the necessary years of clinical trials, and go right to making unjustified clinical claims. There is now a cottage industry for doing just that. While medical professionals are mostly better than this, we also have a problem with premature adoption of new technology, techniques, and products before sufficient high-quality evidence is available.

I do think that AI and machine learning, when properly used, will be great tools for general and mental health. In the future, AIs will likely be increasingly integrated into our health care system, with great benefit. But in the meantime we have to pass through a phase of premature adoption and quack knock-offs.



Posted by Steven Novella